# DeBERTa architecture optimization

## Albertina 1b5 Portuguese Ptbr Encoder

License: MIT
Organization: PORTULAN
Tags: Large Language Model, Transformers, Other

Albertina 1.5B PTBR is a foundation model for the Brazilian Portuguese (PT-BR) variant of Portuguese. It is an encoder of the BERT family, built on the Transformer neural network architecture and developed from the DeBERTa model.